Bayesian Exponential Family Harmoniums

Authors

  • Fan Guo
  • Eric P. Xing
Abstract

A Bayesian Exponential Family Harmonium (BEFH) model is presented for topic modeling of text and multimedia data, and for “posterior” latent semantic projection of such data for subsequent data mining tasks. BEFH provides a Bayesian approach to inference and learning for the recently proposed EFH models and their variants, enabling smoothed, robust estimation of the topic-attribute coupling coefficients, which are reminiscent of the smoothed topical word probabilities in the latent Dirichlet allocation (LDA) model. The Langevin algorithm, combined with an MCMC scheme, is applied for posterior inference in BEFH. An empirical Bayes method is also developed to estimate the hyperparameters.
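To give a concrete picture of the Langevin-within-MCMC idea mentioned in the abstract, below is a minimal sketch of one Metropolis-adjusted Langevin (MALA) update in Python/NumPy. The names log_post, grad_log_post and eps are placeholders for an unnormalized log-posterior, its gradient and the step size; the actual BEFH posterior over the coupling coefficients is far more involved than the toy target in the usage example, so this is only an illustration, not the paper's implementation.

    import numpy as np

    def mala_step(theta, log_post, grad_log_post, eps, rng):
        # Langevin proposal: gradient drift plus Gaussian noise.
        mean_fwd = theta + 0.5 * eps**2 * grad_log_post(theta)
        prop = mean_fwd + eps * rng.standard_normal(theta.shape)
        # Metropolis-Hastings correction for the asymmetric Gaussian proposal.
        mean_bwd = prop + 0.5 * eps**2 * grad_log_post(prop)
        def log_q(x, mean):
            return -np.sum((x - mean) ** 2) / (2.0 * eps**2)
        log_alpha = (log_post(prop) + log_q(theta, mean_bwd)
                     - log_post(theta) - log_q(prop, mean_fwd))
        if np.log(rng.uniform()) < log_alpha:
            return prop, True
        return theta, False

    # Toy usage: sample a 3-dimensional standard normal "posterior".
    rng = np.random.default_rng(0)
    theta = np.zeros(3)
    for _ in range(1000):
        theta, _ = mala_step(theta, lambda t: -0.5 * t @ t, lambda t: -t, 0.5, rng)

The gradient-informed proposal is what distinguishes the Langevin scheme from a plain random-walk Metropolis step; the accept/reject correction keeps the chain targeting the exact posterior despite the approximate drift.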


Related articles

Exponential Family Harmoniums with an Application to Information Retrieval

Directed graphical models with one layer of observed random variables and one or more layers of hidden random variables have been the dominant modelling paradigm in many research fields. Although this approach has met with considerable success, the causal semantics of these models can make it difficult to infer the posterior distribution over the hidden variables. In this paper we propose an al...


Recurrent Exponential-Family Harmoniums without Backprop-Through-Time

Exponential-family harmoniums (EFHs), which extend restricted Boltzmann machines (RBMs) from Bernoulli random variables to other exponential families (Welling et al., 2005), are generative models that can be trained with unsupervised-learning techniques, like contrastive divergence (Hinton et al., 2006; Hinton, 2002), as density estimators for static data. Methods for extending RBMs—and likewis...
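As background for this entry, here is a minimal sketch of one contrastive-divergence (CD-1) update for a Bernoulli-Bernoulli RBM, the simplest member of the EFH family. It is a generic textbook sketch in NumPy (the names cd1_update, W, b, c are illustrative), not the recurrent construction the paper proposes.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(v0, W, b, c, lr, rng):
        # Positive phase: hidden conditionals factorise given the data,
        # which is what makes inference in harmoniums/RBMs a single pass.
        ph0 = sigmoid(v0 @ W + c)
        h0 = (rng.uniform(size=ph0.shape) < ph0).astype(float)
        # One Gibbs reconstruction step (the "CD-1" negative phase).
        pv1 = sigmoid(h0 @ W.T + b)
        v1 = (rng.uniform(size=pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)
        # Approximate gradient: data statistics minus reconstruction statistics.
        n = v0.shape[0]
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
        return W, b, c

Swapping the Bernoulli conditionals for other exponential-family conditionals, while keeping the same bipartite coupling W, is essentially the generalization that EFHs make.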


MMH: Maximum Margin Supervised Harmoniums

Exponential family Harmoniums (EFH) are undirected topic models that enjoy nice properties such as fast inference compared to directed topic models. Supervised EFHs can utilize documents’ side information for discovering predictive latent topic representations. However, existing likelihood-based estimation does not yield conclusive results. This paper presents a max-margin approach to learning ...
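To make the max-margin idea concrete, the snippet below evaluates a standard multiclass hinge loss on latent topic representations. The names H (latent features), V (class weights) and y (labels) are hypothetical; this illustrates the principle only and is not the MMH objective from the paper.

    import numpy as np

    def multiclass_hinge_loss(H, y, V, margin=1.0):
        # H: latent representations (n, k); V: class weights (k, n_classes);
        # y: integer labels (n,). The max-margin principle requires the
        # true-class score to beat every other class score by `margin`.
        scores = H @ V
        true = scores[np.arange(len(y)), y][:, None]
        margins = np.maximum(0.0, margin + scores - true)
        margins[np.arange(len(y)), y] = 0.0   # no penalty for the true class
        return margins.sum(axis=1).mean()

A max-margin learner minimizes such a loss over both the classifier weights and the latent representation, instead of maximizing likelihood alone.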


Stochastic Neural Networks with Monotonic Activation Functions

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, which we call exponential family RBM (Exp-RBM), is a subset of t...
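A rough sketch of one Gaussian-noise reading of this construction is given below: the unit samples around the activation value f(eta) with a variance tied to the local slope f'(eta). The exact Laplace-approximation form used in the paper may differ, and the names stochastic_unit, f and f_prime are placeholders, so treat this as an assumption-laden illustration.

    import numpy as np

    def stochastic_unit(eta, f, f_prime, rng):
        # Sample around the deterministic activation f(eta) with a variance
        # that follows the local slope f'(eta); flat regions of f therefore
        # produce little noise. This is a hedged sketch, not the paper's form.
        mean = f(eta)
        std = np.sqrt(np.maximum(f_prime(eta), 1e-12))
        return mean + std * rng.standard_normal(np.shape(eta))

    # Example with a softplus unit: f(x) = log(1 + exp(x)), f'(x) = sigmoid(x).
    rng = np.random.default_rng(0)
    samples = stochastic_unit(np.array([0.5, 2.0]),
                              lambda x: np.log1p(np.exp(x)),
                              lambda x: 1.0 / (1.0 + np.exp(-x)),
                              rng)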


Invariant Empirical Bayes Confidence Interval for Mean Vector of Normal Distribution and its Generalization for Exponential Family

Based on a given Bayesian model of a multivariate normal with known variance matrix, we find an empirical Bayes confidence interval for the mean vector components, which have a normal distribution. We obtain this empirical Bayes confidence interval conditional on an ancillary statistic. In both cases (i.e., conditional and unconditional empirical Bayes confidence intervals), the empiri...
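For context, here is a basic unconditional empirical Bayes interval for normal means with known variance, estimating the prior variance from the data in the usual shrinkage fashion. The function name eb_normal_interval is illustrative, and the paper's refinement, which conditions on an ancillary statistic, is not reproduced in this sketch.

    import numpy as np
    from scipy.stats import norm

    def eb_normal_interval(x, sigma2, level=0.95):
        # Model: X_i ~ N(theta_i, sigma2) with prior theta_i ~ N(0, tau2);
        # estimate tau2 from the marginal X_i ~ N(0, sigma2 + tau2).
        tau2_hat = max(np.mean(x ** 2) - sigma2, 0.0)
        shrink = tau2_hat / (tau2_hat + sigma2)   # posterior shrinkage factor
        post_mean = shrink * x                    # posterior mean of theta_i
        post_sd = np.sqrt(shrink * sigma2)        # posterior standard deviation
        z = norm.ppf(0.5 + level / 2.0)
        return post_mean - z * post_sd, post_mean + z * post_sd

The interval is centered at the shrunken posterior mean rather than at the raw observation, which is what gives empirical Bayes intervals their improved average coverage and length.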



Journal title:

Volume   Issue

Pages  -

Publication date: 2004